# Unsupervised Training

## Laprador Untrained
by gemasphi · Text Embedding · Transformers · 31 downloads · 0 likes

A sentence-similarity model based on sentence-transformers, capable of mapping text to a 768-dimensional vector space.
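As with any sentence-transformers checkpoint, similarity scoring takes a few lines. A minimal sketch follows; the Hub id `gemasphi/laprador-untrained` is inferred from the card's author and model name and may not match the actual repository.

```python
# Minimal sketch: sentence similarity with a sentence-transformers model.
# The model id below is inferred from the card and may differ on the Hub.
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("gemasphi/laprador-untrained")
sentences = ["How do I reset my password?", "Steps to recover an account login"]
embeddings = model.encode(sentences)  # one 768-dimensional vector per sentence
print(util.cos_sim(embeddings[0], embeddings[1]))  # cosine similarity score
```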
## Unsup Simcse Bert Base Uncased
by princeton-nlp · Text Embedding · 9,546 downloads · 5 likes

An unsupervised sentence-embedding model based on BERT that improves embedding quality through SimCSE, a simple yet effective contrastive learning framework.
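SimCSE sentence embeddings can be read off the [CLS] token with plain transformers. A minimal sketch, assuming the Hub id `princeton-nlp/unsup-simcse-bert-base-uncased` implied by the card:

```python
# Minimal sketch: scoring sentence similarity with unsupervised SimCSE.
# CLS-token pooling follows the SimCSE paper; the Hub id is taken from the card.
import torch
from transformers import AutoModel, AutoTokenizer

name = "princeton-nlp/unsup-simcse-bert-base-uncased"
tokenizer = AutoTokenizer.from_pretrained(name)
model = AutoModel.from_pretrained(name)

texts = ["A man is playing guitar.", "Someone is performing music."]
inputs = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
with torch.no_grad():
    embeddings = model(**inputs).last_hidden_state[:, 0]  # [CLS] vectors
print(torch.nn.functional.cosine_similarity(embeddings[0], embeddings[1], dim=0))
```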
## All T5 Base V1
by doc2query · Apache-2.0 · Text Generation · Transformers · English · 171 downloads · 10 likes

A T5-based doc2query model for document expansion and training-data generation.
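Document expansion with doc2query means generating questions a passage could answer and appending them to the passage before indexing. A sketch under the assumption that the checkpoint lives at `doc2query/all-t5-base-v1` (id inferred from the card):

```python
# Minimal sketch of doc2query-style document expansion with T5.
# The Hub id is inferred from the card's author and model name.
from transformers import T5ForConditionalGeneration, T5Tokenizer

name = "doc2query/all-t5-base-v1"
tokenizer = T5Tokenizer.from_pretrained(name)
model = T5ForConditionalGeneration.from_pretrained(name)

passage = "Python is an interpreted, high-level, general-purpose programming language."
inputs = tokenizer(passage, return_tensors="pt")
outputs = model.generate(
    **inputs, max_length=64, do_sample=True, top_k=10, num_return_sequences=3
)
for ids in outputs:  # each decoded string is a query the passage might answer
    print(tokenizer.decode(ids, skip_special_tokens=True))
```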
## Simcse Chinese Roberta Wwm Ext
by cyclone · Text Embedding · Transformers · 188 downloads · 32 likes

A Simplified Chinese sentence-embedding model trained with simple contrastive learning (SimCSE), using Chinese RoBERTa-wwm-ext as the pre-trained base.
## Contriever Msmarco
by facebook · Text Embedding · Transformers · 24.08k downloads · 27 likes

A fine-tuned version of the Contriever pre-trained model, optimized for dense information retrieval tasks and trained using contrastive learning methods.
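Contriever's model card pools mean token embeddings rather than the [CLS] vector; the sketch below reproduces that pooling for query-document scoring with the `facebook/contriever-msmarco` checkpoint named on the card.

```python
# Minimal sketch: dense retrieval scoring with Contriever (mean pooling).
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("facebook/contriever-msmarco")
model = AutoModel.from_pretrained("facebook/contriever-msmarco")

def embed(texts):
    inputs = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**inputs).last_hidden_state
    mask = inputs["attention_mask"].unsqueeze(-1)
    return (hidden * mask).sum(dim=1) / mask.sum(dim=1)  # average real tokens only

queries = embed(["what causes rain"])
docs = embed(["Rain forms when water vapor condenses.", "The stock market rose today."])
print(queries @ docs.T)  # dot-product relevance scores
```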
## Distilbert Base Uncased Go Emotions Student
by joeddav · MIT · Text Classification · Transformers · English · 143.01k downloads · 76 likes

An emotion classification model distilled from the unlabeled GoEmotions dataset via a zero-shot classification pipeline; it serves as a computationally efficient proof of concept.
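Since the distilled student is an ordinary multi-label text classifier, it works directly with the transformers pipeline. A minimal sketch, using the `joeddav/distilbert-base-uncased-go-emotions-student` id implied by the card:

```python
# Minimal sketch: emotion classification with the distilled student model.
from transformers import pipeline

classifier = pipeline(
    "text-classification",
    model="joeddav/distilbert-base-uncased-go-emotions-student",
    top_k=3,  # return the three highest-scoring emotion labels
)
print(classifier("I can't believe this actually worked, amazing!"))
```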
## SBERT Large Nli V2
by Muennighoff · Text Embedding · Transformers · 43 downloads · 1 like

SBERT-large-nli-v2 is a large sentence-transformer model based on BERT, designed specifically for sentence-similarity calculation and feature extraction.